Modeling Short Over-Dispersed Spike-Train Data: A Hierarchical Parametric Empirical Bayes Framework
Authors
Abstract
In this letter, a Hierarchical Parametric Empirical Bayes model is proposed for spike count data. We integrate Generalized Linear Models (GLMs) and empirical Bayes theory to provide three advantages simultaneously: (1) a model of the over-dispersion of spike count values; (2) reduced mean squared error (MSE) in estimation compared with maximum likelihood estimation for GLMs; and (3) an efficient alternative to inference with fully Bayesian estimators. We apply the model to both simulated data and experimental neural data from the retina. The simulation results indicate that the new model can estimate both the connection weights among neural populations and the output firing rates (mean spike counts) efficiently and accurately. The results from the retinal datasets show that the proposed model outperforms both standard Poisson and Negative Binomial GLMs in terms of the predictive log-likelihood on held-out data.
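The contrast between Poisson and Negative Binomial fits on over-dispersed counts, and the held-out log-likelihood comparison mentioned in the abstract, can be illustrated with a small sketch. The snippet below is not the paper's hierarchical model: it only simulates gamma-Poisson (i.e., Negative Binomial) spike counts, fits an intercept-only Poisson rate by maximum likelihood and a Negative Binomial by the method of moments, and scores both on held-out trials. All parameter values and variable names are hypothetical.

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical over-dispersed spike counts: per-trial Poisson rates drawn from a Gamma,
# so the marginal counts are Negative Binomial (variance exceeds the mean).
mean_rate, shape = 5.0, 2.0
rates = rng.gamma(shape, mean_rate / shape, size=400)
counts = rng.poisson(rates)
train, test = counts[:200], counts[200:]

# Poisson model with only an intercept: the MLE of the rate is the sample mean.
lam = train.mean()

# Negative Binomial fit by the method of moments (r = dispersion, p = success probability).
m, v = train.mean(), train.var(ddof=1)
r = m ** 2 / max(v - m, 1e-9)
p = r / (r + m)

# Per-trial predictive log-likelihood on the held-out half.
ll_pois = stats.poisson.logpmf(test, lam).mean()
ll_nb = stats.nbinom.logpmf(test, r, p).mean()
print(f"train var/mean ratio: {v / m:.2f}")
print(f"held-out log-likelihood  Poisson: {ll_pois:.3f}  Negative Binomial: {ll_nb:.3f}")

On data whose variance exceeds the mean, the Negative Binomial fit generally achieves the higher held-out log-likelihood, which is the kind of comparison the abstract reports for the retinal datasets.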
Similar Articles
Spike train entropy-rate estimation using hierarchical Dirichlet process priors
Entropy rate quantifies the amount of disorder in a stochastic process. For spiking neurons, the entropy rate places an upper bound on the rate at which the spike train can convey stimulus information, and a large literature has focused on the problem of estimating entropy rate from spike train data. Here we present Bayes least squares and empirical Bayesian entropy rate estimators for binary s...
Parametric Empirical Bayes Test and Its Application to Selection of Wavelet Threshold
In this article, we propose a new method for selecting a level-dependent threshold in wavelet shrinkage within the empirical Bayes framework. We employ both Bayesian and frequentist hypothesis testing instead of point estimation. The best test yields the best prior and hence the more appropriate wavelet thresholds. The standard model functions are used to illustrate the performance of the p...
Bayesian entropy estimation for binary spike train data using parametric prior knowledge
Shannon’s entropy is a basic quantity in information theory, and a fundamental building block for the analysis of neural codes. Estimating the entropy of a discrete distribution from samples is an important and difficult problem that has received considerable attention in statistics and theoretical neuroscience. However, neural responses have characteristic statistical structure that generic en...
A Bayesian supervised dual-dimensionality reduction model for simultaneous decoding of LFP and spike train signals.
Neuroscientists are increasingly collecting multimodal data during experiments and observational studies. Different data modalities, such as EEG, fMRI, LFP, and spike trains, offer different views of the complex systems contributing to neural phenomena. Here, we focus on joint modeling of LFP and spike train data, and present a novel Bayesian method for neural decoding to infer behavioral and exp...
A Semiparametric Bayesian Model for Detecting Synchrony Among Multiple Neurons
We propose a scalable semiparametric Bayesian model to capture dependencies among multiple neurons by detecting their cofiring (possibly with some lag time) patterns over time. After discretizing time so there is at most one spike at each interval, the resulting sequence of 1s (spike) and 0s (silence) for each neuron is modeled using the logistic function of a continuous latent variable with a ...